Entropic criterion for model selection (2006)

Author

  • Chih-Yuan Tseng
Abstract

Model or variable selection is usually achieved by ranking models in increasing order of preference. One such method applies the Kullback-Leibler distance, or relative entropy, as the selection criterion. Yet this raises two questions: why use this criterion, and are there other criteria? Moreover, conventional approaches require a reference prior, which is usually difficult to obtain. Following the logic of inductive inference proposed by Caticha [1], we show that relative entropy is the unique criterion; it requires no prior information and can be applied across different fields. We examine this criterion on a physical problem, simple fluids, and the results are promising.

Keywords: Model selection, Inductive inference, Kullback-Leibler distance, Relative entropy, Probability model
PACS: 02.50.Sk, 02.50.Tt, 02.50.Le
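As a rough illustration of the idea in the abstract, the sketch below ranks candidate probability models by their Kullback-Leibler distance to an empirical distribution. It is a minimal numerical example under assumed inputs, not the paper's inductive-inference derivation; the distributions, model names, and the `kl_divergence` helper are hypothetical.

```python
# Minimal sketch of relative-entropy (Kullback-Leibler) model ranking.
# Illustrates the general idea only; the candidate models and the "data"
# distribution below are hypothetical.
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D(p || q) = sum_i p_i log(p_i / q_i), in nats."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

# Empirical distribution inferred from data (hypothetical numbers).
data_dist = np.array([0.1, 0.4, 0.3, 0.2])

# Candidate probability models to rank.
candidates = {
    "model_A": np.array([0.25, 0.25, 0.25, 0.25]),  # uniform
    "model_B": np.array([0.10, 0.45, 0.30, 0.15]),  # close to data
}

# Rank models by increasing divergence: smaller D(data || model) is preferred.
for name, model in sorted(candidates.items(),
                          key=lambda kv: kl_divergence(data_dist, kv[1])):
    print(f"{name}: D = {kl_divergence(data_dist, model):.4f}")
```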


Similar resources

DNA Segmentation as A Model Selection Process (arXiv:physics/0104027v1, 5 Apr 2001)

Previous divide-and-conquer segmentation analyses of DNA sequences do not provide a satisfactory stopping criterion for the recursion. This paper proposes that segmentation be considered as a model selection process. Using the tools in model selection, a limit for the stopping criterion on the relaxed end can be determined. The Bayesian information criterion, in particular, provides a much more...
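To make the stopping rule concrete, here is a minimal sketch of BIC-guided splitting on a toy 0/1 sequence, assuming Bernoulli segment models. The sequence, split point, and parameter counts are hypothetical; this is not the paper's actual segmentation algorithm.

```python
# Minimal sketch of the Bayesian information criterion (BIC) as a stopping
# rule for recursive segmentation. Toy data and split point are hypothetical.
import numpy as np

def bernoulli_loglik(bits):
    """Log-likelihood of a 0/1 sequence under its own MLE Bernoulli rate."""
    n, k = len(bits), int(np.sum(bits))
    if k in (0, n):
        return 0.0  # degenerate segment: probability 1 under the MLE
    p = k / n
    return k * np.log(p) + (n - k) * np.log(1 - p)

def bic(loglik, n_params, n_obs):
    return n_params * np.log(n_obs) - 2.0 * loglik

seq = np.array([0, 0, 0, 1, 0, 0, 1, 1, 1, 1, 0, 1, 1, 1])  # toy 0/1 track
split = len(seq) // 2

# One-segment model: a single Bernoulli rate (1 parameter).
bic_one = bic(bernoulli_loglik(seq), 1, len(seq))

# Two-segment model: a rate per half plus the split point (3 parameters).
bic_two = bic(bernoulli_loglik(seq[:split]) + bernoulli_loglik(seq[split:]),
              3, len(seq))

# Recurse only while splitting lowers the BIC; otherwise stop.
print(f"one-segment BIC = {bic_one:.3f}, two-segment BIC = {bic_two:.3f}")
print("split further" if bic_two < bic_one else "stop recursion")
```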


Positive maps, majorization, entropic inequalities, and detection of entanglement

In this paper, we discuss some general connections between the notions of positive map, weak majorization and entropic inequalities in the context of detection of entanglement among bipartite quantum systems. First, based on the fact that any positive map Λ : Md(C) → Md(C) can be written as the difference between two completely positive maps Λ = Λ1 − Λ2, we propose a possible way to generalize...


The Residual Information Criterion, Corrected (2007)

Shi and Tsai (JRSSB, 2002) proposed an interesting residual information criterion (RIC) for model selection in regression. Their RIC was motivated by the principle of minimizing the Kullback-Leibler discrepancy between the residual likelihoods of the true and candidate model. We show, however, under this principle, RIC would always choose the full (saturated) model. The residual likelihood ther...


Akaike's Information Criterion and Recent Developments in Information Complexity.

In this paper we briefly study the basic idea of Akaike's (1973) information criterion (AIC). Then, we present some recent developments on a new entropic or information complexity (ICOMP) criterion of Bozdogan (1988a, 1988b, 1990, 1994d, 1996, 1998a, 1998b) for model selection. A rationale for ICOMP as a model selection criterion is that it combines a badness-of-fit term (such as minus twice th...
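For reference, a minimal sketch of the AIC computation this abstract alludes to, AIC = 2k − 2 ln L_max, applied to choosing a polynomial degree under Gaussian errors. The synthetic data and the `gaussian_aic` helper are hypothetical, and ICOMP itself is not implemented here.

```python
# Minimal sketch of Akaike's information criterion, AIC = 2k - 2 ln L_max,
# for selecting a polynomial degree under Gaussian errors. Synthetic data.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
y = 1.0 + 2.0 * x + rng.normal(scale=0.3, size=x.size)  # truly linear

def gaussian_aic(y, y_hat, n_coef):
    """AIC with the Gaussian error variance profiled out."""
    n = len(y)
    rss = float(np.sum((y - y_hat) ** 2))
    # Maximized Gaussian log-likelihood up to constants: -(n/2) ln(RSS/n)
    loglik = -0.5 * n * np.log(rss / n)
    k = n_coef + 1  # regression coefficients plus the noise variance
    return 2 * k - 2 * loglik

# Lower AIC is preferred; overfitting degrees pay a complexity penalty.
for degree in (1, 2, 5):
    coefs = np.polyfit(x, y, degree)
    y_hat = np.polyval(coefs, x)
    print(f"degree {degree}: AIC = {gaussian_aic(y, y_hat, degree + 1):.2f}")
```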


Nonextensivity and Galaxy Clustering in the Universe (arXiv:astro-ph/0403472v1, 19 Mar 2004)

We investigate two important questions about the use of the nonextensive thermo-statistics (NETS) formalism in the context of nonlinear galaxy clustering in the Universe. Firstly, we define a quantitative criterion for justifying nonextensivity at different physical scales. Then, we discuss the physics behind the ansatz of the entropic parameter q(r). Our results suggest the approximate range w...




Publication date: 2006